Partial covariance based functional connectivity computation using Ledoit-Wolf covariance regularization

Authors

  • Matthew R. Brier
  • Anish Mitra
  • John E. McCarthy
  • Beau M. Ances
  • Abraham Z. Snyder
Abstract

Functional connectivity refers to shared signals among brain regions and is typically assessed in a task-free state. Functional connectivity is commonly quantified between signal pairs using Pearson correlation. However, resting-state fMRI is a multivariate process exhibiting a complicated covariance structure. Partial covariance assesses the unique variance shared between two brain regions, excluding any widely shared variance, and hence is appropriate for the analysis of multivariate fMRI datasets. However, calculation of partial covariance requires inversion of the covariance matrix, which, in most functional connectivity studies, is not invertible owing to rank deficiency. Here we apply Ledoit-Wolf shrinkage (L2 regularization) to invert the high-dimensional BOLD covariance matrix. We investigate the network organization and brain-state dependence of partial covariance-based functional connectivity. Although resting-state networks (RSNs) are conventionally defined in terms of shared variance, removal of widely shared variance, surprisingly, improved the separation of RSNs in a spring-embedded graphical model. This result suggests that pairwise uniquely shared variance plays a heretofore unrecognized role in RSN covariance organization. In addition, application of partial correlation to fMRI data acquired in the eyes-open vs. eyes-closed states revealed focal changes in uniquely shared variance between the thalamus and visual cortices. This result suggests that partial correlation of resting-state BOLD time series reflects functional processes in addition to structural connectivity.
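As a minimal sketch of the computation described above (assuming Python with NumPy and scikit-learn; this is not the authors' actual pipeline, and the `bold` array below is a synthetic stand-in for extracted regional time series), a Ledoit-Wolf covariance estimate can be inverted and rescaled into a partial correlation matrix even when the number of regions exceeds the number of time points:

```python
# Minimal sketch (not the authors' code): partial correlations from a
# Ledoit-Wolf (L2-regularized) covariance estimate of BOLD time series.
import numpy as np
from sklearn.covariance import LedoitWolf

def partial_correlation(ts):
    """ts: array of shape (n_timepoints, n_regions)."""
    lw = LedoitWolf().fit(ts)
    # The shrunk covariance is well-conditioned, so its inverse (the
    # precision matrix) exists even when n_regions > n_timepoints.
    prec = lw.precision_
    d = np.sqrt(np.diag(prec))
    # Partial correlation between regions i and j: -p_ij / sqrt(p_ii * p_jj)
    pcorr = -prec / np.outer(d, d)
    np.fill_diagonal(pcorr, 1.0)
    return pcorr

# Toy example: 300 frames, 400 regions (the sample covariance would be rank deficient).
rng = np.random.default_rng(0)
bold = rng.standard_normal((300, 400))
pc = partial_correlation(bold)
```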


Similar resources

Regularized Discriminant Analysis Incorporating Prior Knowledge on Gene Functional Groups

In the last decade, the renaissance of interest in discriminant analysis has been primarily motivated by possible applications to tumor classification using high-dimensional microarray-based data. In this thesis, we do three things: 1. First, we introduce a new regularizing covariance estimation procedure we refer to as SHIP: SHrinking and Incorporating Prior knowledge. The resulting covariance ...


Portfolio Optimization via Generalized Multivariate Shrinkage

The shrinkage method of Ledoit and Wolf (2003; 2004a; 2004b) has shown certain success in estimating a well-conditioned covariance matrix for high dimensional portfolios. This paper generalizes the shrinkage method of Ledoit and Wolf to a multivariate shrinkage setting, by which the well-conditioned covariance matrix is estimated using the weighted averaging of multiple priors, instead of singl...


Covariance shrinkage for autocorrelated data

The accurate estimation of covariance matrices is essential for many signal processing and machine learning algorithms. In high-dimensional settings the sample covariance is known to perform poorly, hence regularization strategies such as the analytic shrinkage of Ledoit/Wolf are applied. In the standard setting, i.i.d. data is assumed; however, in practice, time series typically exhibit strong aut...


A shrinkage approach to large-scale covariance matrix estimation and implications for functional genomics.

Inferring large-scale covariance matrices from sparse genomic data is a ubiquitous problem in bioinformatics. Clearly, the widely used standard covariance and correlation estimators are ill-suited for this purpose. As a statistically efficient and computationally fast alternative, we propose a novel shrinkage covariance estimator that exploits the Ledoit-Wolf (2003) lemma for analytic calculation...


Some Hypothesis Tests for the Covariance Matrix When the Dimension Is Large Compared to the Sample Size by Olivier Ledoit

This paper analyzes whether standard covariance matrix tests work when dimensionality is large, and in particular larger than sample size. In the latter case, the singularity of the sample covariance matrix makes likelihood ratio tests degenerate, but other tests based on quadratic forms of sample covariance matrix eigenvalues remain well-defined. We study the consistency property and limiting ...
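As a quick numerical illustration of the singularity issue described in this last entry (a sketch using NumPy and scikit-learn, not the test statistics proposed in the paper): with more variables than samples the sample covariance is rank deficient, whereas a Ledoit-Wolf shrinkage estimate remains full rank and invertible.

```python
# Illustration only: the sample covariance is singular when p > n,
# while the Ledoit-Wolf shrinkage estimate stays invertible.
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 200))            # n = 50 samples, p = 200 variables

sample_cov = np.cov(X, rowvar=False)
print(np.linalg.matrix_rank(sample_cov))      # <= 49, hence singular

lw_cov = LedoitWolf().fit(X).covariance_
print(np.linalg.matrix_rank(lw_cov))          # 200: full rank, invertible
```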



Journal:
  • NeuroImage

Volume 121, Issue -

Pages -

Publication date 2015